93 research outputs found

    On some interrelations of generalized q-entropies and a generalized Fisher information, including a Cramér-Rao inequality

    In this communication, we describe some interrelations between generalized q-entropies and a generalized version of Fisher information. In information theory, the de Bruijn identity links the Fisher information and the derivative of the entropy. We show that this identity can be extended to generalized versions of entropy and Fisher information. More precisely, a generalized Fisher information arises naturally in the expression of the derivative of the Tsallis entropy. This generalized Fisher information also appears as a special case of a generalized Fisher information for estimation problems. Indeed, we derive here a new Cramér-Rao inequality for the estimation of a parameter, which involves a generalized form of Fisher information. This generalized Fisher information reduces to the standard Fisher information as a particular case. In the case of a translation parameter, the general Cramér-Rao inequality leads to an inequality for distributions which is saturated by generalized q-Gaussian distributions. These generalized q-Gaussians are important in several areas of physics and mathematics. They are known to maximize the q-entropies subject to a moment constraint. The Cramér-Rao inequality shows that the generalized q-Gaussians also minimize the generalized Fisher information among distributions with a fixed moment. Similarly, the generalized q-Gaussians also minimize the generalized Fisher information among distributions with a given q-entropy.
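As an illustration of the q-Gaussians mentioned above, the following sketch uses one common parameterization, (1 - (1-q)βx²)₊^{1/(1-q)} (an assumption on our part, not necessarily the exact normalization used in the paper), and checks numerically that the q-Gaussian tends to the ordinary Gaussian as q → 1:

```python
import numpy as np

def q_gaussian(x, q, beta):
    # Assumed common parameterization: (1 - (1-q)*beta*x^2)_+ ** (1/(1-q)),
    # compactly supported for q < 1; the ordinary Gaussian is the q -> 1 limit.
    base = np.maximum(1.0 - (1.0 - q) * beta * x ** 2, 0.0)
    return base ** (1.0 / (1.0 - q))

x = np.linspace(-10.0, 10.0, 200001)
dx = x[1] - x[0]

gauss = np.exp(-x ** 2)
gauss /= gauss.sum() * dx                  # normalized reference density

diffs = {}
for q in (0.5, 0.9, 0.999):
    f = q_gaussian(x, q, beta=1.0)
    f /= f.sum() * dx                      # normalize numerically
    diffs[q] = float(np.max(np.abs(f - gauss)))
    print(q, diffs[q])
```

The maximal pointwise gap to the standard Gaussian shrinks as q approaches 1, consistent with the q → 1 limit.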

    An amended MaxEnt formulation for deriving Tsallis factors, and associated issues

    An amended MaxEnt formulation for systems displaced from the conventional MaxEnt equilibrium is proposed. This formulation involves the minimization of the Kullback-Leibler divergence to a reference Q (or the maximization of a Shannon Q-entropy), subject to a constraint that involves a second reference distribution P_1 and tunes the new equilibrium. In this setting, the equilibrium distribution is the generalized escort distribution associated to P_1 and Q. The account of an additional constraint, an observable given by a statistical mean, leads to the maximization of the Rényi/Tsallis Q-entropy subject to that constraint. Two natural scenarios for this observation constraint are considered, and the classical and generalized constraints of nonextensive statistics are recovered. The solutions to the maximization of the Rényi Q-entropy subject to the two types of constraints are derived. These optimum distributions, which are Lévy-like distributions, are self-referential. We then propose two 'alternate' (but effectively computable) dual functions, whose maximization enables the identification of the optimum parameters. Finally, a duality between solutions and the underlying Legendre structure are presented. Comment: Presented at MaxEnt2006, Paris, France, July 10-13, 2006

    Some inequalities characterizing generalized Gaussians

    In this paper, we propose to characterize a class of generalized Gaussian distributions as the densities that saturate several extensions of classical information-theoretic inequalities. The results for the standard Gaussian are recovered as a particular case.

    On escort distributions, q-gaussians and Fisher information

    Escort distributions are a simple one-parameter deformation of an original distribution p. In Tsallis extended thermostatistics, the escort-averages, defined with respect to an escort distribution, have proved useful for obtaining analytical results and variational equations, with in particular the equilibrium distributions obtained as maxima of Rényi-Tsallis entropy subject to constraints in the form of a q-average. A central example is the q-gaussian, which is a generalization of the standard gaussian distribution. In this contribution, we show that escort distributions emerge naturally as a maximum entropy trade-off between the distribution p(x) and the uniform distribution. This setting may typically describe a phase transition between two states. But escort distributions also appear in the fields of multifractal analysis, quantization and coding, with interesting consequences. For the problem of coding, we recall a source coding theorem by Campbell relating a generalized measure of length to the Rényi-Tsallis entropy, and exhibit the links with escort distributions together with practical implications. That q-gaussians arise from the maximization of Rényi-Tsallis entropy subject to a q-variance constraint is a known fact. We show here that the (squared) q-gaussian also appears as a minimizer of Fisher information subject to the same q-variance constraint.
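The escort construction itself is simple to state: the escort of a discrete distribution p with parameter q is P_q(i) = p_i^q / Σ_j p_j^q. A minimal numerical sketch (the example distribution is our own, chosen for illustration):

```python
import numpy as np

def escort(p, q):
    # Escort distribution P_q(i) = p_i^q / sum_j p_j^q of a discrete pmf p.
    w = np.asarray(p, dtype=float) ** q
    return w / w.sum()

p = np.array([0.7, 0.2, 0.1])
print(escort(p, 1.0))   # q = 1 returns p itself
print(escort(p, 0.5))   # q < 1 flattens p toward the uniform distribution
print(escort(p, 2.0))   # q > 1 sharpens p toward its mode
```

Varying q thus interpolates between the uniform distribution (q → 0) and a point mass at the mode (q → ∞), which is the one-parameter deformation referred to above.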

    Escort entropies and divergences and related canonical distribution

    arXiv: 1109.3311. We discuss two families of two-parameter entropies and divergences, derived from the standard Rényi and Tsallis entropies and divergences. These divergences and entropies are found as divergences or entropies of escort distributions. Exploiting the nonnegativity of the divergences, we derive the expression of the canonical distribution associated to the new entropies and an observable given as an escort-mean value. We show that this canonical distribution extends, and smoothly connects, the results obtained in nonextensive thermodynamics for the standard and generalized mean-value constraints.

    Entropies and entropic criteria

    This chapter focuses on the notions of entropies and maximum-entropy distributions, which are characterized from several angles. Beyond the links with applications in engineering and in physics, we show that it is possible to build regularizing functionals based on a maximum-entropy technique, which can then be used as ad hoc potentials in data-inversion problems. The chapter begins with an overview of the main properties of information measures and with the introduction of various notions and definitions. In particular, we define the Rényi divergence, present the notion of escort distribution, and comment on the maximum-entropy principle used in the sequel. We then present a classical engineering problem, source coding, and show the interest of using length measures other than the standard one, in particular an exponential measure, which leads to a source coding theorem whose lower bound is a Rényi entropy. We also show that the optimal codes can be computed easily from escort distributions. In Section 1.4, we introduce and study a simple model of state transition. This model leads to an equilibrium distribution defined as a generalized escort distribution and, as a by-product, again to a Rényi entropy. We study the Fisher information flow along the curve defined by the generalized escort distribution and obtain connections with the Jeffreys divergence. Finally, we gather several arguments which, in this setting, lead to an inference method based on the minimization of the Rényi entropy subject to a generalized-mean constraint, i.e., a mean taken with respect to the escort distribution.
    From Section 1.5.3 on, we turn to the minimization of the Rényi divergence under a constraint
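As a small complement to the entropy definitions discussed in this chapter, the following sketch evaluates the standard Rényi and Tsallis q-entropies of a discrete distribution and checks that both recover the Shannon entropy in the limit q → 1 (the example distribution is our own, for illustration):

```python
import numpy as np

def shannon(p):
    # Shannon entropy: -sum_i p_i log p_i (natural log).
    p = np.asarray(p, dtype=float)
    return float(-np.sum(p * np.log(p)))

def renyi(p, q):
    # Rényi q-entropy: log(sum_i p_i^q) / (1 - q), for q != 1.
    p = np.asarray(p, dtype=float)
    return float(np.log(np.sum(p ** q)) / (1.0 - q))

def tsallis(p, q):
    # Tsallis q-entropy: (1 - sum_i p_i^q) / (q - 1), for q != 1.
    p = np.asarray(p, dtype=float)
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

p = [0.5, 0.3, 0.2]
for q in (0.5, 0.9, 0.999):
    print(q, renyi(p, q), tsallis(p, q))
print("Shannon:", shannon(p))
```

Both families coincide with the Shannon entropy as q → 1, which is why they are regarded as one-parameter generalizations of it.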

    On generalized Cramér-Rao inequalities, generalized Fisher informations and characterizations of generalized q-Gaussian distributions

    This paper deals with Cramér-Rao inequalities in the context of nonextensive statistics and in estimation theory. It gives characterizations of generalized q-Gaussian distributions, and introduces generalized versions of Fisher information. The contributions of this paper are (i) the derivation of new extended Cramér-Rao inequalities for the estimation of a parameter, involving general q-moments of the estimation error, (ii) the derivation of Cramér-Rao inequalities saturated by generalized q-Gaussian distributions, (iii) the definition of generalized Fisher informations, (iv) the identification and interpretation of some prior results, and finally, (v) the suggestion of new estimation methods.
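The classical case underlying these generalizations can be checked numerically: for the location parameter of a Gaussian N(μ, σ²), the Fisher information is 1/σ², so the Cramér-Rao bound for n samples is σ²/n, and the sample mean attains it. A minimal Monte Carlo sketch (our own illustration of the standard bound, not of the paper's generalized setting):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, n, trials = 2.0, 50, 20000

# Fisher information for the location parameter of N(mu, sigma^2) is 1/sigma^2,
# so the Cramér-Rao bound for an unbiased estimator from n samples is sigma^2/n.
crb = sigma ** 2 / n

samples = rng.normal(loc=1.0, scale=sigma, size=(trials, n))
estimates = samples.mean(axis=1)      # sample mean: unbiased and efficient here
print("empirical variance:", estimates.var())
print("Cramér-Rao bound  :", crb)
```

The empirical variance of the sample mean matches the bound, i.e. the estimator is efficient in this standard case.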

    On extensions of Fisher information and the Cramér-Rao inequality

    We present some extensions of classical results from estimation theory, leading to new forms of Fisher information and to generalizations of the Cramér-Rao bound. We indicate that this suggests a variation on maximum-likelihood estimation. This framework also leads to a new characterization of generalized Gaussians, as well as to extensions of certain identities (de Bruijn) and inequalities (uncertainty relations). The usual results for the standard Gaussian are recovered as a particular case.

    Envelope and phase delays correction in an EER radio architecture

    This article deals with synchronization in the Envelope Elimination and Restoration (EER) type of transmitter architecture. To illustrate the performance of such a solution, we apply this architecture to a 64-carrier, 16-QAM-modulated OFDM signal. We first introduce the problem of realizing a highly linear transmitter. We then present the Envelope Elimination and Restoration solution and draw attention to its major weakness: a high sensitivity to desynchronization between the phase and envelope signal paths. To address this issue, we propose an adaptive synchronization algorithm relying on a feedback loop and a Least Mean Squares formulation, and involving an interpolation step. It enables the correction of delay mismatches and the tracking of possible variations. We demonstrate that the quality of the interpolator has a direct impact on the Error Vector Magnitude (EVM) value and the output spectrum. Implementation details are provided along with an analysis of the behaviour and performance of the method. We present HP ADS and Matlab simulation results and then focus on the enhancement of the transmitter performance using the proposed algorithm.
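The adaptive-synchronization idea can be shown in miniature with a plain LMS identification of an unknown integer delay between two signal paths (a toy sketch of the LMS principle only; the paper's algorithm additionally involves a feedback loop and fractional-delay interpolation for non-integer mismatches):

```python
import numpy as np

rng = np.random.default_rng(1)
true_delay = 3                      # unknown delay between the two paths, in samples
n_taps, mu, n_iter = 8, 0.01, 5000

x = rng.standard_normal(n_iter + n_taps)   # white excitation on the reference path
w = np.zeros(n_taps)                       # adaptive FIR weights

for k in range(n_taps, n_iter + n_taps):
    u = x[k - n_taps + 1:k + 1][::-1]      # regressor: most recent samples first
    d = x[k - true_delay]                  # delayed path to be matched
    e = d - w @ u                          # a-priori error
    w += mu * e * u                        # LMS weight update

print(np.round(w, 2))  # the weight vector peaks at the tap matching the delay
```

After convergence the filter is close to a unit impulse at tap index 3, so the delay mismatch can be read off directly from the adapted weights.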
